SwePub

Search: hsv:(NATURVETENSKAP) hsv:(Fysik) > Hellman Sten > (2000-2004)

  • Results 1-5 of 5
1.
  • Bohm, Christian, et al. (author)
  • ATLAS Level-1 Calorimeter Trigger : Subsystem Tests of a Jet/Energy-sum Processor Module
  • 2004
  • In: IEEE Transactions on Nuclear Science. ISSN 0018-9499 (print), 1558-1578 (online). 51:5, pp. 2356-2361
  • Journal article (peer-reviewed)
    • The ATLAS Level-1 Calorimeter Trigger consists of a Preprocessor, a Cluster Processor (CP), and a Jet/Energy-sum Processor (JEP). The CP and JEP receive digitized trigger-tower data from the Preprocessor and produce trigger multiplicities and total and missing energy for the final trigger decision. The trigger also provides region-of-interest information for the Level-2 trigger and, via Readout Driver modules, intermediate results to the data acquisition system for monitoring and diagnostics. The JEP identifies and localizes jets, and sums total and missing transverse energy information from the trigger data. The Jet/Energy Module (JEM) is the main module of the JEP. The JEM prototype is designed to be functionally identical to the final production module for ATLAS and to have the full number of channels. Three JEM prototypes have been built and successfully tested. Various test-vector patterns were used to test the energy summation and the jet algorithms. Data communication between adjacent JEMs and all other relevant modules of the JEP has also been tested. Recent test results from the JEM prototypes are discussed.
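The energy summation this abstract attributes to the JEM lends itself to a short illustration. The sketch below is a plain-Python rendering of the idea, not the JEM firmware: the `Tower` representation and the name `sum_energies` are invented, and real trigger hardware works on coarse-grained integer sums.

```python
import math
from dataclasses import dataclass

@dataclass
class Tower:
    """Hypothetical trigger-tower readout: transverse energy and azimuth."""
    et: float   # transverse energy, GeV
    phi: float  # azimuthal angle, rad

def sum_energies(towers):
    """Scalar-sum ET over all towers; the missing-ET magnitude is the
    length of the (negated) vector sum of the tower ET components."""
    total_et = sum(t.et for t in towers)
    ex = sum(t.et * math.cos(t.phi) for t in towers)
    ey = sum(t.et * math.sin(t.phi) for t in towers)
    return total_et, math.hypot(ex, ey)

total, missing = sum_energies([Tower(30.0, 0.2), Tower(25.0, 2.9)])
print(f"total ET = {total:.1f} GeV, missing ET = {missing:.1f} GeV")
```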
2.
  • Bohm, Christian, et al. (author)
  • The ATLAS Level-1 Calorimeter Trigger Architecture
  • 2004
  • In: IEEE Transactions on Nuclear Science. ISSN 0018-9499 (print), 1558-1578 (online). 51:3, pp. 356-360
  • Journal article (peer-reviewed)
    • The architecture of the ATLAS Level-1 Calorimeter Trigger system (L1Calo) is presented. Common approaches have been adopted for data distribution, result merging, readout, and slow control across the three different subsystems. A significant amount of common hardware is utilized, yielding substantial savings in cost, spares, and development effort. A custom, high-density backplane has been developed with data paths suitable for both the em/τ cluster processor (CP) and jet/energy-summation processor (JEP) subsystems. Common modules also provide interfaces to VME, CANbus and the LHC timing, trigger and control system (TTC). A common data merger module (CMM) uses field-programmable gate arrays (FPGAs) with multiple configurations for summing electron/photon and τ/hadron cluster multiplicities, jet multiplicities, or total and missing transverse energy. The CMM performs both crate- and system-level merging. A common, FPGA-based readout driver (ROD) is used by all of the subsystems to send input, intermediate and output data to the data acquisition (DAQ) system, and region-of-interest (RoI) data to the level-2 triggers. Extensive use of FPGAs throughout the system makes the trigger flexible and upgradable, and several architectural choices have been made to reduce the number of intercrate links and make the hardware more robust.
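The CMM's two-stage merging, crate-level sums followed by a system-level sum, with multiplicities held in small saturating fields, can be sketched as below. The 3-bit field width and the helper name `merge_multiplicities` are assumptions for illustration; the real bit assignments live in the hardware specification.

```python
def merge_multiplicities(counts, width=3):
    """Sum multiplicity counts into a single saturating field: once the
    sum exceeds what `width` bits can hold, it sticks at the maximum.
    (The 3-bit width is an assumption for illustration.)"""
    return min(sum(counts), (1 << width) - 1)

# Crate-level merge: one saturated count per crate of processor modules...
crate_counts = [merge_multiplicities(m) for m in ([1, 0, 2], [3, 3, 3])]
# ...then the system-level merge over the crate results.
system_count = merge_multiplicities(crate_counts)
print(crate_counts, system_count)  # [3, 7] 7  (second crate saturated)
```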
3.
  • Eerola, Paula, et al. (author)
  • Atlas Data-Challenge 1 on NorduGrid
  • 2003
  • Conference paper (peer-reviewed)
    • The first LHC application ever to be executed in a computational Grid environment is the so-called ATLAS Data-Challenge 1, more specifically, the part assigned to the Scandinavian members of the ATLAS Collaboration. Taking advantage of the NorduGrid testbed and tools, physicists from Denmark, Norway and Sweden were able to participate in the overall exercise, starting in July 2002 and continuing through the rest of 2002 and the first part of 2003, using solely the NorduGrid environment. This made it possible to distribute input data over a wide area and to rely on the NorduGrid resource-discovery mechanism to find an optimal cluster for job submission. During the whole of Data-Challenge 1, more than 2 TB of input data was processed and more than 2.5 TB of output data was produced by more than 4750 Grid jobs.
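The brokering step, letting resource discovery find an optimal cluster for each job, can be reduced to a toy scheduler. Everything below is hypothetical (the `Cluster` attributes, the scoring rule, the site names); the real NorduGrid middleware queried its information system and weighed many more factors.

```python
from dataclasses import dataclass

@dataclass
class Cluster:
    """Hypothetical resource-discovery snapshot of one Grid site."""
    name: str
    free_cpus: int
    has_input_data: bool  # input file already staged at this site?

def pick_cluster(clusters):
    """Prefer sites that already hold the input data (no wide-area
    transfer needed); break ties by the number of free CPUs."""
    return max(clusters, key=lambda c: (c.has_input_data, c.free_cpus))

sites = [Cluster("dk-cluster", 8, False),
         Cluster("se-cluster", 2, True),
         Cluster("no-cluster", 16, False)]
print(pick_cluster(sites).name)  # se-cluster: data locality wins
```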
4.
  • Bohm, Christian, et al. (author)
  • ATLAS TileCal Digitizer Test System and Quality Control
  • 2004
  • In: CERN Note: ATL-TILECAL-2004-009.
  • Journal article (peer-reviewed)
    • This document describes the quality control test procedure for the ATLAS TileCal digitizer unit. Since the ATLAS detector will fill most of the space in the detector cavity, it will be difficult to disassemble parts of the detector; quality control is therefore extremely important for every part of ATLAS. In Stockholm the digitizer units are tested in a testbench to ensure functionality and reliability in the ATLAS environment. The document includes a brief description of the ATLAS TileCal digitizer system and of the testbench setup. It also describes the tests performed and the quality control procedure developed in Stockholm, together with the production status.
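A pass/fail testbench loop of the kind the note describes might look like the following sketch. The test names, thresholds, and the unit's representation are all invented for illustration; the actual Stockholm testbench exercises the real digitizer hardware.

```python
def run_qc(unit, tests):
    """Run every test against one digitizer unit; accept the unit only
    if all tests pass. Returns (accepted, per-test results)."""
    results = {name: check(unit) for name, check in tests.items()}
    return all(results.values()), results

# Invented tests; a real testbench would exercise readout links,
# memories, timing, and behaviour under load.
tests = {
    "responds_to_readout": lambda u: u.get("alive", False),
    "pedestal_in_range":   lambda u: 30 <= u.get("pedestal", -1) <= 70,
}
accepted, detail = run_qc({"alive": True, "pedestal": 45}, tests)
print("ACCEPT" if accepted else "REJECT", detail)
```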
5.
  • Bohm, Christian, et al. (author)
  • ATLAS TileCal Timing
  • 2004
  • In: CERN Note: ATL-TILECAL-2004-008.
  • Journal article (peer-reviewed)
    • The ATLAS TileCal digitizer boards are designed to compensate for time delays in the calorimeter modules. This report describes a method for determining these time delays and correcting for them. The analysis uses testbeam pion data from 2002 and 2003. The result shows that the sampling clock of the digitizers in the barrel calorimeter can be adjusted so that there is always one sample within 1.5 ns of the pulse peak.
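The adjustment the abstract describes can be pictured as a phase shift of the 25 ns sampling clock: measure where the pulse peak falls relative to the sample grid, then delay the clock so a sample coincides with the peak. The sketch below shows only that arithmetic; the peak times are made up, and the report's actual analysis estimates them from testbeam pulses.

```python
SAMPLING_PERIOD_NS = 25.0  # LHC bunch-crossing clock

def clock_shift(peak_time_ns):
    """Delay (ns, within +/- half a sampling period) to add to a
    channel's sampling clock so the nearest sample lands on the
    measured pulse peak."""
    offset = peak_time_ns % SAMPLING_PERIOD_NS  # peak position on the sample grid
    if offset > SAMPLING_PERIOD_NS / 2:
        offset -= SAMPLING_PERIOD_NS            # wrap to the nearest sample
    return offset

for peak in (3.0, 14.0, 24.0):
    print(f"peak at {peak:4.1f} ns -> shift clock by {clock_shift(peak):+5.1f} ns")
```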